3 - MLPDES25: Eigenvalue problems on graphs and hypergraphs [ID:57419]

So thank you everyone for being here today. Also thank you to Giuseppe and Enrique for inviting me here and giving me the chance to talk in such a renowned room. Actually, this is the first time I'm here; it looks much better than all the lecture halls we have in our department, I think. And also thanks to the organizers, to Lorenzo, Darlis, and to Nico, for taking care of everything, for all the help they gave us before this presentation, and also for taking care of the weather. I think these spring vibes are really good for today and for being here in Erlangen.

As the title of my presentation already reveals, I will talk a little bit about eigenvalue problems. It's not yet fully clear what this has to do with PDEs or machine learning, but I'm thankful that Paola already gave the presentation before me, because I will focus a lot on graphs and also try to motivate how we could extend these eigenvalue problems to hypergraphs. I have structured the talk into three sections; I hope the time is fine for that. I will first try to motivate why weighted graphs are a very interesting, universal tool when working with data, and I will give you some examples of how graphs can be used for data science and machine learning. Then, in the second part, I will focus on eigenvalue problems that you probably know already: I will specifically talk about linear eigenvalue problems that were used quite successfully in the past, and then about a recent shift to nonlinear eigenvalue problems, both in the continuous as well as in the discrete setting. Later on, if time allows, I will try to motivate how this could be extended to hypergraphs, which are a generalization of weighted graphs with a little bit more freedom.

So first of all, let me talk about finite weighted graphs and how we can model data on them. I will give you some examples, and I would like to start with image processing, because by my education I was an image processor before, so this came quite naturally to me. If you look on the left-hand side, this is a colorful image. There are some letters, and these big things should be pixels; I had to make them a little bit bigger, otherwise we would not see anything. A graph could now be used to connect one pixel with its four surrounding neighbors, and this kind of local graph could then be used to implement traditional methods, for example finite difference schemes, to work on these images. But with graphs, once you have implemented them, you can also do crazy things like connect components of an image that are far away from each other, beyond these known local neighborhoods of a pixel, and there you could incorporate much more than just the geometry of the letters, namely also semantic information. So here, as you can see in these different colors, there are parts of the image that belong to each other semantically; for example, this red patch here is connected to parts of this hat. Here we have a texture of the hair, and these things are then connected, and then you can do something like non-local denoising or non-local segmentation. And it's using the same framework, because you're working on a graph, and the graph doesn't care if it's local or non-local.

As we have already seen from Paola, you can also use graphs to represent meshes: either in the primal sense of the space, where you use the corners of these triangles and the connecting edges as the graph representation, or you switch to the dual representation, using for example Voronoi diagrams on the surface, and then have a representation of this. So as you can see here, this is kind of a hierarchical surface representation by a graph, and this picture is actually from Gabriel Peyré, who will also be here at the workshop, so I thought this might be a good point to use it here.

But now we can even use graphs on less structured data. Here we already had a kind of surface that is connected, and we know the geometry; but if you only have points, like in this point cloud, together with color information, the question is: how can you implement any numerical scheme on these point clouds? Well, one solution would be to use a graph, to build up a graph that connects all these dots, and then you can use your favorite numerical scheme to solve a PDE or minimize some energy on this point cloud. So this was all now in three dimensions, but why not go up into higher dimensions? So first, here what I would like to show is

Accessible via: Open access
Duration: 00:30:39 min
Recording date: 2025-04-28
Uploaded on: 2025-04-29 15:41:57
Language: en-US

#MLPDES25 Machine Learning and PDEs Workshop 
Mon. – Wed. April 28 – 30, 2025
HOST: FAU MoD, Research Center for Mathematics of Data at FAU, Friedrich-Alexander-Universität Erlangen-Nürnberg Erlangen – Bavaria (Germany)
 
SPEAKERS
• Paola Antonietti. Politecnico di Milano
• Alessandro Coclite. Politecnico di Bari
• Fariba Fahroo. Air Force Office of Scientific Research
• Giovanni Fantuzzi. FAU MoD/DCN-AvH, Friedrich-Alexander-Universität Erlangen-Nürnberg
• Borjan Geshkovski. Inria, Sorbonne Université
• Paola Goatin. Inria, Sophia-Antipolis
• Shi Jin. SJTU, Shanghai Jiao Tong University
• Alexander Keimer. Universität Rostock
• Felix J. Knutson. Air Force Office of Scientific Research
• Anne Koelewijn. FAU MoD, Friedrich-Alexander-Universität Erlangen-Nürnberg
• Günter Leugering. FAU, Friedrich-Alexander-Universität Erlangen-Nürnberg
• Lorenzo Liverani. FAU, Friedrich-Alexander-Universität Erlangen-Nürnberg
• Camilla Nobili. University of Surrey
• Gianluca Orlando. Politecnico di Bari
• Michele Palladino. Università degli Studi dell’Aquila
• Gabriel Peyré. CNRS, ENS-PSL
• Alessio Porretta. Università di Roma Tor Vergata
• Francesco Regazzoni. Politecnico di Milano
• Domènec Ruiz-Balet. Université Paris Dauphine
• Daniel Tenbrinck. FAU, Friedrich-Alexander-Universität Erlangen-Nürnberg
• Daniela Tonon. Università di Padova
• Juncheng Wei. Chinese University of Hong Kong
• Yaoyu Zhang. Shanghai Jiao Tong University
• Wei Zhu. Georgia Institute of Technology
 
SCIENTIFIC COMMITTEE 
• Giuseppe Maria Coclite. Politecnico di Bari
• Enrique Zuazua. FAU MoD/DCN-AvH, Friedrich-Alexander-Universität Erlangen-Nürnberg
 
ORGANIZING COMMITTEE 
• Darlis Bracho Tudares. FAU MoD/DCN-AvH, Friedrich-Alexander-Universität Erlangen-Nürnberg
• Nicola De Nitti. Università di Pisa
• Lorenzo Liverani. FAU DCN-AvH, Friedrich-Alexander-Universität Erlangen-Nürnberg
 
Video teaser of the #MLPDES25 Workshop: https://youtu.be/4sJPBkXYw3M
 
 
#FAU #FAUMoD #MLPDES25 #workshop #erlangen #bavaria #germany #deutschland #mathematics #research #machinelearning #neuralnetworks

Tags

Erlangen, mathematics, Neural Network, PDE, Applied Mathematics, FAU MoD, Partial Differential Equations, Bavaria, Machine Learning, FAU MoD workshop, FAU